A Learning-Rate Schedule for Stochastic Gradient Methods to Matrix Factorization
Authors
Abstract
Stochastic gradient methods are effective for solving matrix factorization problems. However, it is well known that the performance of stochastic gradient methods depends heavily on the learning-rate schedule used; a good schedule can significantly speed up the training process. In this paper, motivated by past work on convex optimization that assigns a learning rate to each variable, we propose a new schedule for matrix factorization. Experiments demonstrate that the proposed schedule converges faster than existing ones. Our schedule uses the same parameter on all data sets included in our experiments, so the time spent on learning-rate selection can be significantly reduced. By applying this schedule to a state-of-the-art matrix factorization package, the resulting implementation outperforms available parallel matrix factorization packages.
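To make the idea of a per-variable learning rate concrete, the following is a minimal sketch of SGD matrix factorization with an AdaGrad-style per-coordinate schedule, where each entry of the factor matrices keeps its own rate shrunk by its accumulated squared gradients. This is an illustration of the general technique the abstract refers to, not the paper's exact schedule; the function name `sgd_mf_adagrad` and all hyperparameter defaults are assumptions.

```python
import numpy as np

def sgd_mf_adagrad(R, k=8, eta0=0.1, epochs=20, lam=0.05, seed=0):
    """SGD matrix factorization with a per-coordinate (AdaGrad-style)
    learning-rate schedule: every entry of P and Q has its own rate,
    divided by the square root of its accumulated squared gradients.
    Illustrative sketch, not the schedule proposed in the paper."""
    rng = np.random.default_rng(seed)
    m, n = R.shape
    rows, cols = np.nonzero(R)                  # treat nonzeros as observed ratings
    P = 0.1 * rng.standard_normal((m, k))
    Q = 0.1 * rng.standard_normal((n, k))
    GP = np.full((m, k), 1e-8)                  # accumulated squared gradients for P
    GQ = np.full((n, k), 1e-8)                  # accumulated squared gradients for Q
    for _ in range(epochs):
        for u, i in zip(rows, cols):
            e = R[u, i] - P[u] @ Q[i]           # prediction error on this entry
            gp = -e * Q[i] + lam * P[u]         # gradient w.r.t. row P[u]
            gq = -e * P[u] + lam * Q[i]         # gradient w.r.t. row Q[i]
            GP[u] += gp * gp
            GQ[i] += gq * gq
            P[u] -= eta0 / np.sqrt(GP[u]) * gp  # per-coordinate step
            Q[i] -= eta0 / np.sqrt(GQ[i]) * gq
    return P, Q
```

The point of the per-coordinate accumulators `GP` and `GQ` is that frequently updated coordinates automatically receive smaller steps, so a single initial rate `eta0` works across data sets, which is the property the abstract highlights.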
Similar resources
Online Passive-Aggressive Algorithms for Non-Negative Matrix Factorization and Completion
Stochastic Gradient Descent (SGD) is a popular online algorithm for large-scale matrix factorization. However, SGD can often be difficult for practitioners to use, because its performance is very sensitive to the choice of the learning-rate parameter. In this paper, we present non-negative passive-aggressive (NN-PA), a family of online algorithms for non-negative matrix factorization (NMF). Our al...
Online Kernel Matrix Factorization
The problem of efficiently applying a kernel-induced feature-space factorization to large-scale data sets is addressed in this thesis. Kernel matrix factorization methods have shown good performance in solving machine learning and data analysis problems. However, the present growth in the amount of available information implies that these problems cannot be solved with conventional methods, due to their...
Simulated Annealing with Levy Distribution for Fast Matrix Factorization-Based Collaborative Filtering
Matrix factorization is one of the best approaches for collaborative filtering because of its high accuracy in representing users' and items' latent factors. The main disadvantages of matrix factorization are its complexity and the fact that it is very hard to parallelize, especially with very large matrices. In this paper, we introduce a new method for collaborative filtering based on Matrix Factorization ...
Stochastic variance reduced multiplicative update for nonnegative matrix factorization
Nonnegative matrix factorization (NMF), a dimensionality reduction and factor analysis method, is a special case of factorization in which the factor matrices have low-rank nonnegative constraints. Considering stochastic learning in NMF, we specifically address the multiplicative update (MU) rule, which is the most popular but has a slow convergence property. The present paper introduces the stochastic...
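The MU rule mentioned above is the classic Lee–Seung update, which preserves nonnegativity because every factor in the update is nonnegative. The following is a minimal sketch of that baseline rule only, not the stochastic variance-reduced variant the cited paper proposes; the function name `nmf_mu` and its defaults are assumptions.

```python
import numpy as np

def nmf_mu(V, r=5, iters=200, eps=1e-9, seed=0):
    """Classic multiplicative-update (MU) rule for NMF:
        H <- H * (W^T V) / (W^T W H)
        W <- W * (V H^T) / (W H H^T)
    Every factor in the update is nonnegative, so W and H stay
    nonnegative throughout. Illustrative baseline sketch."""
    rng = np.random.default_rng(seed)
    m, n = V.shape
    W = rng.random((m, r)) + eps   # random nonnegative initialization
    H = rng.random((r, n)) + eps
    for _ in range(iters):
        H *= (W.T @ V) / (W.T @ W @ H + eps)  # eps avoids division by zero
        W *= (V @ H.T) / (W @ H @ H.T + eps)
    return W, H
```

The slow convergence the passage refers to is visible in practice: the ratio factors approach 1 near a stationary point, so the updates take many iterations to settle, which is what motivates accelerated and variance-reduced variants.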
Exploiting k-Degree Locality to Improve Overlapping Community Detection
Community detection is of crucial importance in understanding the structures of complex networks. In many real-world networks, communities naturally overlap, since a node usually has multiple community memberships. One popular technique for overlapping community detection is Matrix Factorization (MF). However, existing MF-based models have ignored the fact that, besides neighbors, “local non-...